3,613 research outputs found

    Cognitive science and the cultural challenge

    Peer Reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/111778/1/soca12120.pd

    Using Supervised Principal Components Analysis to Assess Multiple Pollutant Effects

    BACKGROUND: Many investigations of the adverse health effects of multiple air pollutants analyze the time series involved by entering the multiple pollutants simultaneously into a Poisson log-linear model. This method can yield unstable parameter estimates when the pollutants are highly intercorrelated; traditional approaches to multicollinearity, such as principal component analysis (PCA), have therefore been promoted in this context.
    OBJECTIVES: A characteristic of PCA is that its construction does not consider the relationship between the covariates and the adverse health outcomes. A refined version of PCA, supervised principal components analysis (SPCA), is proposed that specifically addresses this issue.
    METHODS: Models controlling for long-term trends and weather effects were used in conjunction with SPCA and with PCA to estimate the association between multiple air pollutants and mortality for U.S. cities. The methods were compared further via a simulation study.
    RESULTS: The simulation studies demonstrated that SPCA, unlike PCA, successfully identified the correct subset of multiple pollutants associated with mortality. Because of this property, SPCA and PCA returned different estimates of the relationship between air pollution and mortality.
    CONCLUSIONS: Although a number of methods for assessing the effects of multiple pollutants have been proposed, such methods can falter in the presence of high correlation among pollutants. Both PCA and SPCA address this issue. By allowing pollutants that are not associated with the adverse health outcomes to be excluded from the selected mixture, SPCA offers a critical improvement over PCA.
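    To make the SPCA procedure concrete, here is a minimal sketch in Python of the screen-then-decompose idea behind supervised principal components: univariate screening of each pollutant against mortality, PCA restricted to the retained pollutants, then a Poisson regression on the leading component. The synthetic data, variable names, and threshold are illustrative assumptions, not the study's actual analysis (which also controlled for trends and weather).

```python
# Minimal SPCA sketch for a multi-pollutant Poisson mortality model.
# All data are synthetic; in a real analysis the screening threshold
# would be chosen by cross-validation.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, p = 1000, 6
pollutants = rng.normal(size=(n, p))                 # daily pollutant levels
deaths = rng.poisson(np.exp(1.0 + 0.3 * pollutants[:, 0]))  # daily mortality

# Step 1: screen each pollutant by its univariate association with mortality.
scores = []
for j in range(p):
    X = sm.add_constant(pollutants[:, j])
    fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
    scores.append(abs(fit.tvalues[1]))

# Step 2: keep only pollutants whose screening score passes the threshold.
threshold = 2.0
keep = [j for j, s in enumerate(scores) if s > threshold]

# Step 3: ordinary PCA, but only on the selected pollutants.
Xk = pollutants[:, keep]
Xk = (Xk - Xk.mean(axis=0)) / Xk.std(axis=0)
_, _, Vt = np.linalg.svd(Xk, full_matrices=False)
pc1 = Xk @ Vt[0]                                     # first supervised component

# Step 4: Poisson regression of mortality on the supervised component.
final = sm.GLM(deaths, sm.add_constant(pc1),
               family=sm.families.Poisson()).fit()
print(final.summary())
```

    The screening step is the only difference from ordinary PCA, and it is what lets SPCA drop pollutants that carry no signal for the outcome before the components are formed.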

    Modeling a 3-D multiscale blood-flow and heat-transfer framework for realistic vascular systems

    Modeling of biological domains and simulation of the biophysical processes occurring in them can help inform medical procedures. However, when considering complex domains such as large regions of the human body, the complexity of blood vessel branching and the variation of blood vessel dimensions present a major modeling challenge. Here, we present a Voxelized Multi-Physics Simulation (VoM-PhyS) framework to simulate coupled heat transfer and fluid flow on a multi-scale voxel mesh of the biological domain. In this framework, flow in larger blood vessels is modeled using the Hagen–Poiseuille equation for one-dimensional flow, coupled with a three-dimensional two-compartment porous-media model for capillary circulation in tissue. The Dirac distribution function is used as a Sphere of Influence (SoI) parameter to couple the one-dimensional and three-dimensional flows. This blood-flow system is coupled with a heat-transfer solver to provide a complete thermo-physiological simulation. The framework is demonstrated on a frog tongue, and further analysis is conducted to study the effect of convective heat exchange between blood vessels and tissue, and the effect of the SoI on simulation results.
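    For reference, flow in the one-dimensional vessel segments follows the Hagen–Poiseuille relation, shown below in its standard textbook form (the framework's exact discretisation and coupling terms are not given in the abstract):

```latex
% Volumetric flow rate Q through a rigid cylindrical vessel segment of
% radius r and length L, driven by a pressure drop \Delta P across the
% segment, for blood of dynamic viscosity \mu.
Q = \frac{\pi r^{4} \, \Delta P}{8 \mu L}
```

    The fourth-power dependence on radius is what makes the variation of vessel dimensions across a branching tree so influential: a small change in r produces a large change in the flow a segment carries.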

    The IT framework of the European Archive of Historical Earthquake Data (AHEAD)

    The European Archive of Historical EArthquake Data (AHEAD) has been developed in the frame of the EC project NERIES and maintained in the frame of the EC project SHARE. AHEAD makes available on the web the results of networked historical earthquake data research, formalised in terms of studies (papers, reports, macroseismic data points, etc.). It provides an updated wealth of data that are unique for many European events in the time window 1000-1963. A series of IT solutions has been developed to support both the research and the networking activities carried out while building AHEAD. The resulting framework is an equally balanced effort in back-end and front-end design and implementation, a key feature of a research approach that is very much human-centred, where the quantity of data is small compared with terabytes of instrumental data. AHEAD is composed of five mutually dependent data components:
    1) the “Digital Library”, where all the historical earthquake studies are stored and described by bibliographical metadata;
    2) the “Consensus Earthquake Inventory”, where the relevant macroseismic data (event date, epicentral area, number of macroseismic data-points, maximum observed intensity) are extrapolated, the best available information is selected, and fake earthquakes are flagged;
    3) the “European Macroseismic Database”, where all the available macroseismic data-points (MDPs) are stored;
    4) the “Parameters Laboratory”, where earthquake parameterisation methods are applied to MDPs to obtain epicentral locations and magnitudes; and
    5) the “European Earthquake Catalogue”.
    The presentation will demonstrate the adopted IT solutions separately for the back-end and the front-end, both for the access-restricted website and for the general-purpose implementation designed to be included in the “Earthquake Data Portal”, developed within the EC project NERIES, which targets a much broader scientific community.
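    To illustrate how the five data components relate, here is a minimal sketch of the data model in Python. Every class and field name is an assumption made for illustration; the abstract does not publish AHEAD's actual schema.

```python
# Hypothetical sketch of AHEAD's five data components; names are
# illustrative assumptions, not the archive's real schema.
from dataclasses import dataclass, field

@dataclass
class Study:                      # "Digital Library" entry
    citation: str                 # bibliographical metadata
    year: int

@dataclass
class MacroseismicDataPoint:      # "European Macroseismic Database" record
    locality: str
    intensity: str                # macroseismic intensity at the locality

@dataclass
class InventoryEntry:             # "Consensus Earthquake Inventory" record
    date: str                     # event date
    epicentral_area: str
    n_mdps: int                   # number of macroseismic data-points
    max_intensity: str            # maximum observed intensity
    fake: bool = False            # flagged as a fake earthquake

@dataclass
class CatalogueEvent:             # "European Earthquake Catalogue" entry
    lat: float                    # epicentral location and magnitude are
    lon: float                    # produced from MDPs by the
    magnitude: float              # "Parameters Laboratory"
    sources: list = field(default_factory=list)  # contributing Studies
```

    The dependency chain mirrors the abstract: studies in the Digital Library supply MDPs, the inventory records the consensus view of each event, and the Parameters Laboratory turns MDPs into the locations and magnitudes stored in the catalogue.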